Convergence Estimates for Preconditioned Gradient Subspace Iteration Eigensolvers
Authors
Abstract
Subspace iteration for computing several eigenpairs (i.e. eigenvalues and eigenvectors) of an eigenvalue problem is an alternative to the deflation technique, whereby the eigenpairs are computed successively by projecting the problem onto the subspace orthogonal to the eigenvectors already found. The main advantage of subspace iteration over deflation is its ‘cluster robustness’: even if some of the computed eigenvalues form a cluster (i.e. are very close to each other), the convergence does not deteriorate. For standard subspace iteration eigensolvers this fact is well known and is supported by convergence estimates. This paper tackles the so-called preconditioned gradient subspace iteration eigensolvers – a relatively new class of methods designed to efficiently compute several extreme eigenpairs of large-scale eigenvalue problems. Using a new approach to the convergence analysis of subspace iterations, based on eigenvalue sums rather than individual eigenvalues, the paper presents new convergence results for a class of preconditioned gradient subspace iteration eigensolvers that are fully cluster robust, i.e. involve the distances between the eigenvalues in a cluster neither in the assumptions nor in the estimates themselves.
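To fix ideas, the following is a minimal NumPy/SciPy sketch of the step pattern behind such methods: precondition the block residual, augment the current subspace with it, and extract the smallest Ritz pairs by a Rayleigh-Ritz projection. The names, the preconditioner T (some approximation of A^{-1}), and the column rescaling are illustrative assumptions, not the exact scheme analyzed in the paper.

```python
# A hedged sketch of a block preconditioned gradient subspace iteration for
# A x = lambda M x with symmetric positive definite A and M. T approximates
# A^{-1}; all names and parameters here are assumptions for illustration.
import numpy as np
from scipy.linalg import eigh

def block_grad_iteration(A, M, T, X, steps=50):
    """Return approximations to the p smallest eigenpairs, p = X.shape[1]."""
    p = X.shape[1]
    for _ in range(steps):
        theta, Y = eigh(X.T @ A @ X, X.T @ M @ X)  # Ritz pairs in span(X)
        X = X @ Y                                  # M-orthonormal Ritz basis
        W = T @ (A @ X - (M @ X) * theta)          # preconditioned block residual
        W = W / np.linalg.norm(W, axis=0)          # rescaling; leaves span(W) unchanged
        S = np.hstack([X, W])                      # augmented trial subspace
        theta, Y = eigh(S.T @ A @ S, S.T @ M @ S)  # small generalized Ritz problem
        X = S @ Y[:, :p]                           # keep the p smallest Ritz vectors
    return theta[:p], X
```

Because the Ritz extraction depends only on the span of S, the column rescaling changes nothing in exact arithmetic; it merely keeps the small Gram matrices well scaled. A practical code would additionally deflate converged columns and stop on a residual criterion.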
Similar Papers
On preconditioned eigensolvers and Invert-Lanczos processes
This paper deals with the convergence analysis of various preconditioned iterations for computing the smallest eigenvalue of a discretized self-adjoint elliptic partial differential operator. For these eigenproblems, several preconditioned iterative solvers are known, but unfortunately the convergence theory for some of these solvers is not very well understood. The aim is to show that precond...
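As a concrete reference point, here is a minimal sketch of the simplest solver of this kind, a preconditioned gradient (PINVIT-type) iteration for the smallest eigenpair; the preconditioner T ≈ A^{-1}, the tolerance, and the names are assumptions for illustration.

```python
# A sketch of the preconditioned gradient iteration x <- x - T (A x - rho M x)
# for the smallest eigenpair of A x = lambda M x; T approximates A^{-1}.
import numpy as np

def pinvit(A, M, T, x, tol=1e-10, maxit=500):
    for _ in range(maxit):
        x = x / np.sqrt(x @ (M @ x))      # M-normalize the iterate
        rho = x @ (A @ x)                 # Rayleigh quotient (x^T M x = 1)
        r = A @ x - rho * (M @ x)         # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        x = x - T @ r                     # preconditioned gradient step
    return rho, x
```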
Convergence Analysis of Restarted Krylov Subspace Eigensolvers
The A-gradient minimization of the Rayleigh quotient allows one to construct robust and fast-convergent eigensolvers for the generalized eigenvalue problem for (A,M) with symmetric and positive definite matrices. The A-gradient steepest descent iteration is the simplest case of more general restarted Krylov subspace iterations, for the special case that all step-wise generated Krylov subspaces are tw...
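Read this way, one steepest descent step is a Rayleigh-Ritz extraction over the plane span{x, A^{-1}Mx}, which contains the A-gradient of the Rayleigh quotient. Below is a hedged sketch, with a dense solve standing in for whatever inner solver a real code would use.

```python
# One A-gradient steepest descent step for (A, M) as a two-dimensional
# Rayleigh-Ritz problem over span{x, A^{-1} M x} (illustrative sketch).
import numpy as np
from scipy.linalg import eigh

def a_gradient_sd_step(A, M, x):
    w = np.linalg.solve(A, M @ x)              # Krylov direction A^{-1} M x
    S = np.column_stack([x, w])                # 2D trial space containing the A-gradient
    theta, Y = eigh(S.T @ A @ S, S.T @ M @ S)  # 2-by-2 generalized Ritz problem
    return theta[0], S @ Y[:, 0]               # smallest Ritz pair
```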
Sharp Ritz Value Estimates for Restarted Krylov Subspace Iterations
Gradient iterations for the Rayleigh quotient are elementary methods for computing the smallest eigenvalues of a pair of symmetric and positive definite matrices. A considerable convergence acceleration can be achieved by preconditioning and by computing Rayleigh-Ritz approximations from subspaces of increasing dimension. An example of the resulting Krylov subspace eigensolvers is the generaliz...
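A hedged sketch of such a restarted Krylov subspace eigensolver: grow a Krylov space of A^{-1}M from the current iterate, extract the smallest Ritz pair, and restart from it. The restart length k and the dense solves are assumptions for illustration.

```python
# Restarted Krylov subspace iteration for the smallest eigenpair of (A, M):
# Rayleigh-Ritz on the k-dimensional Krylov space of A^{-1} M, then restart.
import numpy as np
from scipy.linalg import eigh

def restarted_krylov(A, M, x, k=5, restarts=20):
    for _ in range(restarts):
        V = [x / np.linalg.norm(x)]
        for _ in range(k - 1):
            V.append(np.linalg.solve(A, M @ V[-1]))  # next Krylov vector
        S = np.linalg.qr(np.column_stack(V))[0]      # orthonormal basis of the space
        theta, Y = eigh(S.T @ A @ S, S.T @ M @ S)    # small generalized Ritz problem
        x = S @ Y[:, 0]                              # restart with the best Ritz vector
    return theta[0], x
```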
The Block Preconditioned Steepest Descent Iteration for Elliptic Operator Eigenvalue Problems
The block preconditioned steepest descent iteration is an iterative eigensolver for subspace eigenvalue and eigenvector computations. An important area of application of the method is the approximate solution of mesh eigenproblems for self-adjoint and elliptic partial differential operators. The subspace iteration allows one to compute some of the smallest eigenvalues together with the associated i...
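As an illustration of that setting, the self-contained sketch below assembles the P1 finite-element pencil (A, M) for -u'' = lambda u on (0,1) with Dirichlet boundary conditions, whose smallest eigenvalues approximate pi^2, 4 pi^2, 9 pi^2, and runs a few block preconditioned steepest descent steps. The 'ideal' preconditioner T = A^{-1}, the mesh size, and the step count are all assumptions for the demo.

```python
# Mesh eigenproblem demo: block preconditioned steepest descent for the
# 1D Laplacian pencil from P1 finite elements (all parameters assumed).
import numpy as np
from scipy.linalg import eigh

n = 199                                          # interior mesh nodes
h = 1.0 / (n + 1)                                # mesh width
e = np.ones(n - 1)
A = (2.0 * np.eye(n) - np.diag(e, 1) - np.diag(e, -1)) / h        # stiffness matrix
M = (4.0 * np.eye(n) + np.diag(e, 1) + np.diag(e, -1)) * h / 6.0  # mass matrix
T = np.linalg.inv(A)   # 'ideal' preconditioner, for illustration only; a
                       # practical code would use multigrid or an incomplete
                       # factorization instead

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 3))                  # block for the 3 smallest eigenpairs
for _ in range(20):
    theta, Y = eigh(X.T @ A @ X, X.T @ M @ X)    # Ritz pairs in span(X)
    X = X @ Y
    W = T @ (A @ X - (M @ X) * theta)            # preconditioned block residual
    W = W / np.linalg.norm(W, axis=0)            # keep the Gram matrices well scaled
    S = np.hstack([X, W])
    X = (S @ eigh(S.T @ A @ S, S.T @ M @ S)[1])[:, :3]

theta = eigh(X.T @ A @ X, X.T @ M @ X)[0]
print(theta)                                     # close to (pi^2, 4 pi^2, 9 pi^2)
```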
Gradient Flow Approach to Geometric Convergence Analysis of Preconditioned Eigensolvers
Preconditioned eigenvalue solvers (eigensolvers) are gaining popularity, but their convergence theory remains sparse and complex. We consider the simplest preconditioned eigensolver, the gradient iterative method with a fixed step size, for symmetric generalized eigenvalue problems, where we use the gradient of the Rayleigh quotient as an optimization direction. A sharp convergence rate bound fo...
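A minimal sketch of that iteration under assumed notation: the plain gradient update x <- x - tau (A x - rho(x) M x) with a constant step size tau, which must be chosen small enough for convergence (roughly below 2/lambda_max(A) when M is the identity).

```python
# Fixed-step-size gradient iteration for the smallest eigenpair of
# A x = lambda M x; tau and the step count are illustrative assumptions.
import numpy as np

def fixed_step_gradient(A, M, x, tau, steps=1000):
    for _ in range(steps):
        x = x / np.sqrt(x @ (M @ x))           # M-normalize the iterate
        rho = x @ (A @ x)                      # Rayleigh quotient (x^T M x = 1)
        x = x - tau * (A @ x - rho * (M @ x))  # gradient step with fixed tau
    return rho, x
```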